Least-squares trigonometric regression estimation



Similar resources

Nonparametric regression estimation using penalized least squares

We present multivariate penalized least squares regression estimates. We use Vapnik-Chervonenkis theory and bounds on the covering numbers to analyze convergence of the estimates. We show strong consistency of the truncated versions of the estimates without any conditions on the underlying distribution.
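The penalization idea above can be illustrated with a minimal ridge-type sketch in plain NumPy. This is not the authors' estimator; the penalty weight `lam` and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def penalized_least_squares(X, y, lam=1.0):
    """Ridge-type penalized least squares:
    argmin_w ||X w - y||^2 + lam * ||w||^2,
    solved via the closed form w = (X^T X + lam I)^{-1} X^T y."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic data: linear signal plus small noise (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
w = penalized_least_squares(X, y, lam=0.1)
```

With a small `lam` and low noise, `w` lands close to the generating coefficients; larger `lam` shrinks the estimate toward zero, which is the bias–variance trade-off the penalty buys.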


PEDOMODELS FITTING WITH FUZZY LEAST SQUARES REGRESSION

Pedomodels have become a popular topic in soil science and environmental research. They are predictive functions of certain soil properties based on other easily or cheaply measured properties. The common method for fitting pedomodels is to use classical regression analysis, based on the assumptions of data crispness and deterministic relations among variables. In modeling natural systems such as s...


Discrete Least Squares Approximation by Trigonometric Polynomials

We present an efficient and reliable algorithm for discrete least squares approximation of a real-valued function given at arbitrary distinct nodes in [0, 2π) by trigonometric polynomials. The algorithm is based on a scheme for the solution of an inverse eigenproblem for unitary Hessenberg matrices, and requires only O(mn) arithmetic operations as compared with O(mn²) operations needed for alg...
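The problem being solved can be sketched naively with a dense least-squares solve. Note this direct approach is exactly the O(mn²)-cost baseline the abstract contrasts with, not the paper's fast unitary-Hessenberg algorithm; the degree `n` and the test nodes are illustrative assumptions.

```python
import numpy as np

def trig_lstsq(theta, f, n):
    """Least-squares fit of a degree-n trigonometric polynomial
    p(t) = a0 + sum_{k=1}^n (a_k cos(k t) + b_k sin(k t))
    to values f at arbitrary distinct nodes theta in [0, 2*pi)."""
    cols = [np.ones_like(theta)]
    for k in range(1, n + 1):
        cols.append(np.cos(k * theta))
        cols.append(np.sin(k * theta))
    A = np.column_stack(cols)          # m x (2n+1) design matrix
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    return A, coef

# 40 arbitrary nodes; f lies exactly in the degree-3 trig space.
theta = np.sort(np.random.default_rng(1).uniform(0, 2 * np.pi, 40))
f = 1.0 + 2.0 * np.cos(theta) - 0.5 * np.sin(3 * theta)
A, coef = trig_lstsq(theta, f, n=3)
```

Because `f` lies in the fitted space, the least-squares solution recovers the generating coefficients exactly; for general data it returns the best approximation in the 2-norm over the chosen nodes.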


Least Squares Percentage Regression

Percentage error (relative to the observed value) is often felt to be more meaningful than the absolute error taken in isolation. The mean absolute percentage error (MAPE) is widely used in forecasting as a basis of comparison, and regression models can be fitted which minimize this criterion. Unfortunately, no formula exists for the coefficients, and models for a given data set may not be unique. We...
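One tractable relative-error criterion, distinct from the MAPE itself (which, as the abstract notes, admits no closed-form coefficients), is the sum of squared percentage errors; minimizing it reduces to ordinary least squares on rows rescaled by 1/y. The sketch below assumes all observed responses are nonzero; the data are illustrative.

```python
import numpy as np

def percentage_least_squares(X, y):
    """Minimize sum(((y - X @ w) / y)**2). Dividing each row of the
    system X w = y by its y turns this into plain least squares
    against a vector of ones (requires every y to be nonzero)."""
    Xw = X / y[:, None]          # row-rescaled design matrix
    ones = np.ones_like(y)       # y / y
    w, *_ = np.linalg.lstsq(Xw, ones, rcond=None)
    return w

# Exact linear data with strictly positive responses (illustrative).
rng = np.random.default_rng(2)
X = rng.uniform(1.0, 2.0, size=(50, 2))
y = X @ np.array([2.0, 3.0])
w = percentage_least_squares(X, y)
```

On noiseless data the criterion is zero at the generating coefficients, so they are recovered exactly; on noisy data the rescaling simply down-weights observations with large responses.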


Compressed Least-Squares Regression

We consider the problem of learning, from K data points, a regression function in a linear space of high dimension N using projections onto a random subspace of lower dimension M. From any algorithm minimizing the (possibly penalized) empirical risk, we provide bounds on the excess risk of the estimate computed in the projected subspace (compressed domain) in terms of the excess risk of the estimate b...
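The compressed-domain idea can be sketched as a Gaussian random projection followed by ordinary least squares in the M-dimensional subspace. The 1/sqrt(M) scaling of the projection and the synthetic data are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def compressed_lstsq(X, y, M, rng):
    """Project N-dimensional features onto a random M-dimensional
    subspace, then solve ordinary least squares in that subspace."""
    N = X.shape[1]
    P = rng.normal(size=(N, M)) / np.sqrt(M)   # random projection matrix
    beta, *_ = np.linalg.lstsq(X @ P, y, rcond=None)
    return P, beta

# K data points, ambient dimension N, compressed dimension M << N.
rng = np.random.default_rng(3)
K, N, M = 200, 100, 20
X = rng.normal(size=(K, N))
y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=K)
P, beta = compressed_lstsq(X, y, M, rng)
residual = np.linalg.norm(y - (X @ P) @ beta)
```

Only `M` coefficients are estimated instead of `N`, which is the variance reduction the excess-risk bounds quantify; the price is the approximation error from restricting to the random subspace.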



Journal

Journal title: Applicationes Mathematicae

Year: 1999

ISSN: 1233-7234, 1730-6280

DOI: 10.4064/am-26-2-121-131